Securing Distributed SGD Against Gradient Leakage Threats

Authors

Abstract

This paper presents a holistic approach to gradient leakage resilient distributed Stochastic Gradient Descent (SGD). First, we analyze two types of strategies for privacy-enhanced federated learning: (i) gradient pruning with random selection or low-rank filtering and (ii) gradient perturbation with additive noise or differential privacy noise. We analyze the inherent limitations of these approaches and their underlying impact on privacy guarantee, model accuracy, and attack resilience. Next, we present an approach to securing distributed SGD in federated learning, using controlled differential privacy noise as the tool. Unlike conventional methods with per-client noise injection and a fixed noise parameter strategy, our approach keeps track of the trend of per-example gradient updates and makes noise injection adaptive, closely aligned with that trend throughout training. Finally, we provide an empirical analysis of the privacy guarantee, model utility, and attack resilience of the proposed approach. Extensive evaluation using five benchmark datasets demonstrates that our approach can outperform state-of-the-art methods with competitive accuracy, a strong differential privacy guarantee, and high resilience against gradient leakage attacks.
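As an illustrative sketch only (not the paper's own algorithm), the two baseline strategies the abstract contrasts could be implemented along these lines; the helper names `prune_random` and `perturb_gaussian` and all parameter values are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_random(grad, keep_ratio=0.1):
    """Gradient pruning by random selection: keep a random
    subset of coordinates and zero out the rest."""
    mask = rng.random(grad.shape) < keep_ratio
    return grad * mask

def perturb_gaussian(grad, clip_norm=1.0, sigma=0.5):
    """Gradient perturbation: clip the gradient to a norm bound,
    then add Gaussian noise scaled to that bound (the standard
    Gaussian-mechanism recipe used in DP-SGD-style training)."""
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma * clip_norm, grad.shape)

g = rng.normal(size=1000)
g_pruned = prune_random(g)   # sparse: roughly 10% of entries survive
g_noisy = perturb_gaussian(g)  # bounded-norm signal plus calibrated noise
```

Both transforms trade gradient fidelity for leakage resistance, which is exactly the accuracy/resilience tension the abstract says a fixed-parameter strategy handles poorly.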


Similar articles

Securing Computation against Continuous Leakage

We present a general method to compile any cryptographic algorithm into one which resists side channel attacks of the only computation leaks information variety for an unbounded number of executions. Our method uses as a building block a semantically secure subsidiary bit encryption scheme with the following additional operations: key refreshing, oblivious generation of ciphertexts, leakage re...


Defending against insider threats and internal data leakage

In the last decade, computer science researchers have been working hard to prevent attacks against the security of information systems. Different adversary models have incarnated the malicious entities against which researchers have defined security properties, identified security vulnerabilities, and engineered security defenses. These adversaries were usually intruders, that is, outsiders try...


Revisiting Distributed Synchronous SGD

Distributed training of deep learning models on large-scale training data is typically conducted with asynchronous stochastic optimization to maximize the rate of updates, at the cost of additional noise introduced from asynchrony. In contrast, the synchronous approach is often thought to be impractical due to idle time wasted on waiting for straggling workers. We revisit these conventional bel...
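The synchronous approach that abstract revisits can be sketched in a few lines; the straggler mitigation shown here (averaging only the first gradients to arrive, treating the rest as backups) is one common fix discussed in that line of work, and the function name and parameters are illustrative assumptions:

```python
import numpy as np

def sync_sgd_step(params, worker_grads, num_backup=1, lr=0.1):
    """One synchronous SGD step with backup workers: wait only for
    the fastest (N - num_backup) worker gradients, average them,
    and apply the update, so stragglers cannot stall the step."""
    n = len(worker_grads) - num_backup
    avg = np.mean(worker_grads[:n], axis=0)  # fastest n arrivals
    return params - lr * avg

params = np.zeros(3)
grads = [np.ones(3), 2 * np.ones(3), 3 * np.ones(3)]  # arrival order
new_params = sync_sgd_step(params, grads, num_backup=1)
```

Unlike the asynchronous scheme, every applied update is computed from the current parameters, so no staleness noise is introduced; the cost is the discarded backup gradients.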


Addressing Insider Threats and Information Leakage

Insider threats are one of the problems of organizational security that are most difficult to handle. It is often unclear whether or not an actor is an insider, or what we actually mean by “insider”. It also is often impossible to determine whether an insider action is permissible, or whether it constitutes an insider attack. From a technical standpoint, the biggest concern is the discriminatio...


QSGD: Communication-Efficient SGD via Gradient Quantization and Encoding

Parallel implementations of stochastic gradient descent (SGD) have received significant research attention, thanks to its excellent scalability properties. A fundamental barrier when parallelizing SGD is the high bandwidth cost of communicating gradient updates between nodes; consequently, several lossy compression heuristics have been proposed, by which nodes only communicate quantized gradient...
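A minimal sketch of QSGD-style stochastic quantization, written from the scheme's published description (the function name and level count `s` are illustrative): each coordinate is stochastically rounded to one of `s + 1` magnitude levels relative to the gradient's norm, so the quantized gradient remains an unbiased estimate while needing far fewer bits:

```python
import numpy as np

_rng = np.random.default_rng(0)

def qsgd_quantize(v, s=4):
    """Stochastically quantize v to s+1 levels of |v_i| / ||v||,
    keeping the sign. Rounding up with probability equal to the
    fractional part makes the estimator unbiased: E[q(v)] = v."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    level = np.abs(v) / norm * s           # real-valued level in [0, s]
    lower = np.floor(level)
    prob = level - lower                   # chance of rounding up
    rounded = lower + (_rng.random(v.shape) < prob)
    return norm * np.sign(v) * rounded / s
```

Only the norm (one float), the signs, and the integer levels need to be transmitted, which is the bandwidth saving the abstract refers to.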



Journal

Journal title: IEEE Transactions on Parallel and Distributed Systems

Year: 2023

ISSN: 1045-9219, 1558-2183, 2161-9883

DOI: https://doi.org/10.1109/tpds.2023.3273490